Search for: All records

Creators/Authors contains: "Peters, Christina"


  1. De Vita, R.; Espinal, X.; Laycock, P.; Shadura, O. (Eds.)
    In dual-phase time-projection chambers, photosensor arrays are arranged to allow inference of the positions of interactions within the detector. When a broken or saturated photosensor leaves a gap in the data, position inference becomes less precise and less accurate. Because photosensors cannot be repaired or replaced once the experiment has begun, we develop methods to estimate the missing signals. Our group is developing a probabilistic graphical model of the correlations between the numbers of photons detected by adjacent photosensors, which represents the probability distribution over detected photons as a Poisson distribution. Determining the posterior probability distribution over the number of photons detected by a sensor then requires integration over a multivariate Poisson distribution, which is computationally intractable in high dimensions. In this work, we present an approach to quickly calculate and integrate over a multidimensional Poisson distribution. Our approach uses Zarr, a Python array-compression package, to manage large multidimensional arrays, and approximates the log factorial to calculate the Poisson distribution quickly and without overflow.
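    The record does not include code; the following minimal Python sketch illustrates the log-space technique the abstract describes, using scipy.special.gammaln for the log factorial so the Poisson probability can be evaluated at large counts without overflow, and Zarr to hold the resulting multidimensional array. All rates, grid shapes, chunk sizes, and file names below are illustrative assumptions, not the group's actual implementation.

        import numpy as np
        import zarr
        from scipy.special import gammaln  # gammaln(k + 1) == log(k!)

        def poisson_logpmf(k, mu):
            # Log of the Poisson PMF, computed entirely in log space so the
            # factorial term never overflows for large photon counts k.
            k = np.asarray(k, dtype=np.float64)
            return k * np.log(mu) - mu - gammaln(k + 1)

        # Joint log-probability over a grid of counts for three independent
        # sensors (illustrative rates); exponentiating and summing then
        # approximates integration over the multivariate distribution.
        mus = [3.0, 7.5, 12.0]
        grids = np.meshgrid(*[np.arange(60)] * len(mus), indexing="ij")
        logp = sum(poisson_logpmf(g, mu) for g, mu in zip(grids, mus))

        # Keep the large joint array chunked and compressed on disk with Zarr
        # rather than holding it all in memory.
        z = zarr.open("logp.zarr", mode="w", shape=logp.shape,
                      chunks=(20, 20, 20), dtype="f8")
        z[:] = logp
        print(np.exp(logp).sum())  # approaches 1.0 as the grid widens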
  2. This work proposes a domain-informed neural network architecture for experimental particle physics, using particle interaction localization with the time-projection chamber (TPC) technology for dark matter research as an example application. A key feature of the signals generated within the TPC is that they allow localization of particle interactions through a process called reconstruction (i.e., inverse-problem regression). While multilayer perceptrons (MLPs) have emerged as a leading contender for reconstruction in TPCs, such a black-box approach does not reflect prior knowledge of the underlying scientific processes. This paper looks anew at neural network-based interaction localization and encodes prior detector knowledge, in terms of both signal characteristics and detector geometry, into the feature-encoding and output layers of a multilayer (deep) neural network. The resulting network, termed the Domain-informed Neural Network (DiNN), limits the receptive fields of the neurons in the initial feature-encoding layers to account for the spatially localized nature of the signals produced within the TPC. This aspect of the DiNN, which resembles the emerging area of graph neural networks in that neurons in the initial layers connect to only a handful of neurons in the succeeding layer, significantly reduces the number of parameters in the network compared to an MLP. In addition, to account for the detector geometry, the output layers of the network are modified using two geometric transformations that ensure the DiNN produces localizations within the interior of the detector. The end result is a neural network architecture with 60% fewer parameters than an MLP that still achieves similar localization performance, and one that provides a path to future architectural developments with improved performance because additional domain knowledge can be encoded into the architecture.
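    As a reading aid only, here is a hedged PyTorch sketch of the two architectural ideas this abstract names: a first layer whose weight matrix is masked so each neuron sees only a local patch of sensors (the limited receptive field), and an output transformation that maps raw network outputs into the interior of a disk-shaped detector. The mask construction, layer sizes, detector radius, and the particular squashing function are all hypothetical choices for illustration, not the paper's actual DiNN.

        import torch
        import torch.nn as nn

        class LocallyMaskedLinear(nn.Linear):
            # A linear layer whose weights are zeroed outside a fixed 0/1 mask,
            # restricting each hidden neuron's receptive field to nearby
            # sensors (the graph-neural-network-like sparsity described above).
            def __init__(self, in_features, out_features, mask):
                super().__init__(in_features, out_features)
                self.register_buffer("mask", mask)  # (out_features, in_features)

            def forward(self, x):
                return nn.functional.linear(x, self.weight * self.mask, self.bias)

        class DiNNSketch(nn.Module):
            def __init__(self, n_sensors, hidden, mask, radius=1.0):
                super().__init__()
                self.encode = LocallyMaskedLinear(n_sensors, hidden, mask)
                self.head = nn.Sequential(nn.ReLU(), nn.Linear(hidden, hidden),
                                          nn.ReLU(), nn.Linear(hidden, 2))
                self.radius = radius

            def forward(self, x):
                raw = self.head(self.encode(x))
                # Geometric output transform (one illustrative choice): squash
                # raw (x, y) into a disk of the detector radius, so every
                # localization lands strictly inside the detector interior.
                r = torch.linalg.vector_norm(raw, dim=-1, keepdim=True)
                return raw / (1.0 + r) * self.radius

        # Hypothetical usage: 64 sensors, 32 hidden neurons, ~8 inputs each.
        mask = (torch.rand(32, 64) < 8 / 64).float()
        net = DiNNSketch(64, 32, mask)
        print(net(torch.rand(5, 64)))  # every row has norm < radius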